# 6 billion parameters

**GPT-J 6B** (EleutherAI) · Apache-2.0
GPT-J 6B is a 6-billion-parameter autoregressive language model trained with the Mesh Transformer JAX framework, using the same tokenizer as GPT-2/GPT-3.
Tags: Large Language Model · English
Downloads: 297.31k · Likes: 1,493
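The "6 billion parameters" figure can be sanity-checked from GPT-J's published configuration (28 layers, hidden size 4096, feed-forward size 16384, vocabulary 50400). A rough back-of-envelope sketch, omitting biases and layer-norm parameters (which add well under 1%); rotary position embeddings contribute no learned parameters:

```python
# Back-of-envelope parameter count for GPT-J 6B, from its published config.
n_layer = 28      # transformer blocks
d_model = 4096    # hidden size
d_ff = 16384      # feed-forward inner size (4 * d_model)
vocab = 50400     # tokenizer vocabulary (GPT-2 vocab, padded)

# Token embedding matrix.
embed = vocab * d_model

# Per block: Q, K, V, and output projections, plus the two FFN matrices.
attn = 4 * d_model * d_model
ffn = 2 * d_model * d_ff
blocks = n_layer * (attn + ffn)

# GPT-J uses a separate (untied) language-model head.
lm_head = vocab * d_model

total = embed + blocks + lm_head
print(f"{total / 1e9:.2f}B parameters")  # → 6.05B parameters
```

The result lands at roughly 6.05 billion, matching the model's name.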
© 2025 AIbase